Sequential Adaptive Compressed Sampling via Huffman Codes

Abstract



There are two main approaches in compressed sensing: the geometric approach and the combinatorial approach. In this paper we introduce an information theoretic approach and use results from the theory of Huffman codes to construct a sequence of binary sampling vectors to determine a sparse signal. Unlike other approaches, our approach is adaptive in the sense that each sampling vector depends o...
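As a rough illustration of the adaptive idea (a toy sketch only, not the paper's Huffman-code construction: the bisection strategy, test signal, and function names below are invented for the example), the following locates the single nonzero entry of a 1-sparse signal using binary sampling vectors, each chosen from the outcome of the previous measurement:

```python
import numpy as np

def adaptive_binary_sampling(x):
    """Locate the nonzero entry of a 1-sparse signal x by bisection,
    using 0/1 sampling vectors chosen adaptively: each new vector
    depends on the outcome of the previous measurement."""
    lo, hi = 0, len(x)
    measurements = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        a = np.zeros(len(x))          # binary sampling vector
        a[lo:mid] = 1.0               # probe the left half of the window
        y = a @ x                     # one linear measurement <a, x>
        measurements += 1
        if y != 0:                    # nonzero mass is in the left half
            hi = mid
        else:                         # otherwise it is in the right half
            lo = mid
    return lo, measurements

x = np.zeros(64)
x[37] = 5.0                           # 1-sparse test signal
print(adaptive_binary_sampling(x))    # -> (37, 6): log2(64) measurements
```

A non-adaptive scheme must fix all sampling vectors in advance; the paper's contribution is to construct the adaptive vectors using results from the theory of Huffman codes rather than plain bisection.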

Related Articles

Level-compressed Huffman decoding

Based on breadth-first search and the level-compression technique, this letter first presents a new array data structure to represent the classical Huffman tree. The corresponding decoding algorithm is then given. Both the memory and the decoding time required by the proposed method are less than those of previous methods. Experiments are carried out to demonstrate the advantages of the...
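The letter's exact level-compressed layout is not reproduced here, but the underlying idea of flattening a Huffman tree into an array in breadth-first order, so that decoding becomes index arithmetic rather than pointer chasing, can be sketched as follows (a minimal Python illustration; the array layout and helper names are assumptions of this sketch, not the authors' data structure):

```python
import heapq

def huffman_tree(freqs):
    """Build a Huffman tree as nested 2-tuples from {symbol: frequency}."""
    heap = [(f, i, s) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    return heap[0][2]

def to_array(root):
    """Flatten the tree breadth-first. An internal node stores the index
    of its left child (the right child sits at that index + 1); a leaf
    stores its symbol."""
    queue, arr = [root], []
    while queue:
        node = queue.pop(0)
        if isinstance(node, tuple):                 # internal node
            arr.append(('N', len(arr) + len(queue) + 1))
            queue.extend(node)
        else:                                       # leaf
            arr.append(('L', node))
    return arr

def decode(arr, bits):
    """Decode a bitstring by walking the array: bit 0 -> left child,
    bit 1 -> right child; emit the symbol and restart at a leaf."""
    out, i = [], 0
    for b in bits:
        i = arr[i][1] + int(b)
        if arr[i][0] == 'L':
            out.append(arr[i][1])
            i = 0
    return ''.join(out)

arr = to_array(huffman_tree({'a': 5, 'b': 2, 'c': 1, 'd': 1}))
print(decode(arr, "100010"))                        # -> "abc"
```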


On the random property of compressed data via Huffman coding

Though Huffman codes [2,3,4,5,9] have shown their power in data compression, some issues have gone largely unnoticed. In the present paper, we address the randomness properties of data compressed via Huffman coding. Randomized computation is the only known method for many notoriously difficult #P-complete problems, such as the permanent, and some network reliability problems [1,...
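As a crude empirical illustration of the question (a sketch only: a bit-balance count, not a formal randomness test and not the paper's analysis), one can Huffman-compress a sample text and check how evenly the 0 and 1 bits are distributed in the output:

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a Huffman code book {symbol: bitstring} from {symbol: count}."""
    heap = [(f, i, {s: ''}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: '0' + c for s, c in c1.items()}        # lighter subtree
        merged.update({s: '1' + c for s, c in c2.items()})  # heavier subtree
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

text = "abracadabra alabama cascade " * 200
code = huffman_code(Counter(text))
bits = ''.join(code[ch] for ch in text)
# Truly random output would have a fraction of 1-bits near 0.5; Huffman
# output is close to but not exactly uniform, which is the kind of
# question the paper examines.
print(f"fraction of 1-bits: {bits.count('1') / len(bits):.4f}")
```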


Huffman Codes (Lecturer: Michel)

Shannon's noiseless coding theorem tells us how compactly we can compress messages in which all letters are drawn independently from an alphabet $A$ and we are given the probability $p_a$ of each letter $a \in A$ appearing in the message. Shannon's theorem says that, for random messages with $n$ letters, the expected number of bits we need to transmit is at least $nH(p) = -n \sum_{a \in A} p_a \log_2 p_a$ bits, and ther...
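For a concrete instance of the bound, the snippet below evaluates $H(p)$ and the resulting $nH(p)$ lower bound for an illustrative four-letter distribution (the probabilities are made up for the example):

```python
import math

def entropy_bits(p):
    """Shannon entropy H(p) = -sum_{a in A} p_a * log2(p_a), in bits/letter."""
    return -sum(pa * math.log2(pa) for pa in p.values() if pa > 0)

p = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}  # illustrative distribution
n = 1000
print(f"H(p) = {entropy_bits(p):.3f} bits per letter")             # 1.750
print(f"lower bound for n = {n}: {n * entropy_bits(p):.0f} bits")  # 1750
```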


Entropy and Huffman Codes

We will show that:

• the entropy of a random variable gives a lower bound on the number of bits needed per character for a binary coding
• Huffman codes are optimal in the average number of bits used per character among binary codes
• the average bits per character used by Huffman codes is close to the entropy of the underlying random variable
• one can get arbitrarily close to the entropy of a...
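These claims can be checked numerically. The sketch below (an illustration under an invented distribution, not part of the notes) builds a Huffman code and compares its average code length $L$ with the entropy $H(p)$, which must satisfy $H(p) \le L < H(p) + 1$:

```python
import heapq
import math

def huffman_lengths(p):
    """Return {symbol: code length} for a Huffman code on distribution p."""
    heap = [(pa, i, {s: 0}) for i, (s, pa) in enumerate(p.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        # merging two subtrees pushes every symbol one level deeper
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

p = {'a': 0.4, 'b': 0.3, 'c': 0.2, 'd': 0.1}   # illustrative distribution
H = -sum(pa * math.log2(pa) for pa in p.values())
L = sum(p[s] * length for s, length in huffman_lengths(p).items())
print(f"H(p) = {H:.3f} bits, Huffman average length = {L:.3f} bits")
# -> H(p) = 1.846 bits, Huffman average length = 1.900 bits
```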



Journal

Journal title: Sampling Theory, Signal Processing, and Data Analysis

Year: 2011

ISSN: 2730-5716, 2730-5724

DOI: 10.1007/bf03549543